
    Realizing Pico: Finally No More Passwords!

    In 2011 Stajano proposed Pico, a secure and easy-to-use alternative to passwords. Among the many proposals in this category, Pico stands out as creative and convincing. However, the description as published leaves some details unspecified, and to the best of our knowledge the complete system has not yet been tested. This work presents detailed specifications and future-proof security protocols for Pico. Moreover, we present the first robust and efficient Pico implementation. Our implementation allows the Pico concept to mature further and can be used for large-scale usability evaluations at negligible cost.

    Efficient Sparse Merkle Trees: Caching Strategies and Secure (Non-)Membership Proofs

    A sparse Merkle tree is an authenticated data structure based on a perfect Merkle tree of intractable size. It contains a distinct leaf for every possible output from a cryptographic hash function, and can be simulated efficiently because the tree is sparse (i.e., most leaves are empty). We are the first to provide complete, succinct, and recursive definitions of a sparse Merkle tree and related operations. We show that our definitions enable efficient space-time trade-offs for different caching strategies, and that verifiable audit paths can be generated to prove (non-)membership in practically constant time (<4 ms) when using SHA-512/256. This holds despite a limited amount of cache space (smaller than the size of the underlying data structure being authenticated) and full (concrete) security in the multi-instance setting.
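The key idea above (empty subtrees collapse to precomputed default hashes, so only non-empty branches are ever materialized) can be sketched as follows. This is an illustrative toy (depth 8, SHA-256, no caching strategy), not the paper's SHA-512/256 construction; in such a tree an audit path for an empty leaf serves as a non-membership proof.

```python
import hashlib

DEPTH = 8  # toy depth; a real sparse Merkle tree would use the hash output length

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Precompute the hash of an empty subtree at each height: an empty leaf
# hashes to h(b""), and an empty parent is the hash of two empty children.
EMPTY = [h(b"")]
for _ in range(DEPTH):
    EMPTY.append(h(EMPTY[-1] + EMPTY[-1]))

def root(leaves: dict, height: int = DEPTH, prefix: int = 0) -> bytes:
    """Root of the (sub)tree; `leaves` maps leaf index -> non-empty value."""
    if not leaves:
        return EMPTY[height]  # sparse case: no recursion into empty subtrees
    if height == 0:
        return h(leaves[prefix])
    mid = prefix + (1 << (height - 1))
    left = {k: v for k, v in leaves.items() if k < mid}
    right = {k: v for k, v in leaves.items() if k >= mid}
    return h(root(left, height - 1, prefix) + root(right, height - 1, mid))

def audit_path(leaves: dict, index: int, height: int = DEPTH, prefix: int = 0):
    """Sibling hashes from leaf `index` up to the root: a (non-)membership proof."""
    if height == 0:
        return []
    mid = prefix + (1 << (height - 1))
    left = {k: v for k, v in leaves.items() if k < mid}
    right = {k: v for k, v in leaves.items() if k >= mid}
    if index < mid:
        return audit_path(left, index, height - 1, prefix) + [root(right, height - 1, mid)]
    return audit_path(right, index, height - 1, mid) + [root(left, height - 1, prefix)]

def verify(leaf_value, index: int, path, expected_root: bytes) -> bool:
    """Recompute the root; leaf_value=None checks that the leaf is empty."""
    node = h(leaf_value) if leaf_value is not None else EMPTY[0]
    for level, sibling in enumerate(path):
        if (index >> level) & 1:          # bit set: we are the right child
            node = h(sibling + node)
        else:                             # bit clear: we are the left child
            node = h(node + sibling)
    return node == expected_root
```

The same `verify` routine handles both proof types: a membership proof hashes the claimed value, while a non-membership proof hashes the empty-leaf default.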

    Attacks on Karlsson and Mitrokotsa's Grouping-Proof-Distance-Bounding Protocol

    In the recent IEEE communication letter "Grouping-Proof-Distance-Bounding Protocols: Keep All Your Friends Close" by Karlsson and Mitrokotsa, a protocol for grouping-proof distance-bounding (GPDB) is proposed. In this letter, we show that the proof generated by the proposed GPDB protocol does not actually prove anything. Furthermore, we provide a construction towards a distance-bounding grouping-proof; however, it remains unclear whether one can ever truly combine (privacy-preserving) distance-bounding and a grouping-proof.

    Exploring data sharing obligations in the technology sector

    This report addresses the question: what is the role of data in the technology sector, and what are the opportunities and risks of mandatory data sharing? The answer provides insight into the costs and benefits of variants of data sharing obligations with and between technology companies.

    A Survey on Lightweight Entity Authentication with Strong PUFs

    Physically unclonable functions (PUFs) exploit the unavoidable manufacturing variations of an integrated circuit (IC). Their input-output behavior serves as a unique IC 'fingerprint'. Therefore, they have been envisioned as an IC authentication mechanism, in particular the subclass of so-called strong PUFs. The protocol proposals are typically accompanied by two PUF promises: lightweight operation and increased resistance against physical attacks. In this work, we review nineteen proposals in chronological order: from the original strong PUF proposal (2001) to the more complicated noise bifurcation and system-of-PUFs proposals (2014). The assessment is aided by a unified notation and a transparent framework of PUF protocol requirements.
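As context for the surveyed protocols, the baseline strong-PUF authentication idea (enroll challenge-response pairs, then spend one unused pair per authentication) can be sketched as below. The `SimulatedStrongPUF` class is a hypothetical software stand-in in which a random secret models the physical manufacturing variation; real PUF responses are additionally noisy, which this sketch ignores.

```python
import hashlib
import secrets

class SimulatedStrongPUF:
    """Hypothetical stand-in for a physical strong PUF: the per-instance
    manufacturing variation is modelled by a random secret."""
    def __init__(self):
        self._variation = secrets.token_bytes(16)  # models the IC 'fingerprint'

    def respond(self, challenge: bytes) -> bytes:
        # A real strong PUF derives this from circuit delays, not a hash.
        return hashlib.sha256(self._variation + challenge).digest()

class Verifier:
    """Basic (non-privacy-preserving) CRP-table authentication."""
    def __init__(self):
        self.crps = []

    def enroll(self, puf: SimulatedStrongPUF, n: int = 100) -> None:
        """One-time secure enrollment: record n challenge-response pairs."""
        for _ in range(n):
            c = secrets.token_bytes(16)
            self.crps.append((c, puf.respond(c)))

    def authenticate(self, device) -> bool:
        """Spend one unused CRP; never reuse a pair (replay protection)."""
        challenge, expected = self.crps.pop()
        return device.respond(challenge) == expected
```

The finite CRP table and the lack of noise handling are exactly the kinds of limitations the surveyed protocols try to address.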

    Practical robustness evaluation in radiotherapy - A photon and proton-proof alternative to PTV-based plan evaluation

    Background and purpose: A planning target volume (PTV) in photon treatments aims to ensure that the clinical target volume (CTV) receives adequate dose despite treatment uncertainties. The underlying static dose cloud approximation (the assumption that the dose distribution is invariant to errors) is problematic in intensity modulated proton treatments, where range errors should be taken into account as well. The purpose of this work is to introduce a robustness evaluation method that is applicable to photon and proton treatments and is consistent with (historic) PTV-based treatment plan evaluations. Materials and methods: The limitation of the static dose cloud approximation was solved in a multi-scenario simulation by explicitly calculating doses for various treatment scenarios that describe possible errors in the treatment course. Setup errors were the same as the CTV-PTV margin, and the underlying theory of 3D probability density distributions was extended to 4D to include range errors, maintaining a 90% confidence level. Scenario dose distributions were reduced to voxel-wise minimum and maximum dose distributions; the first to evaluate CTV coverage and the second to evaluate hot spots. Acceptance criteria for CTV D98 and D2 were calibrated against PTV-based criteria from historic photon treatment plans. Results: CTV D98 in worst case scenario dose and voxel-wise minimum dose showed a very strong correlation with scenario average D98 (R² > 0.99). The voxel-wise minimum dose visualised CTV dose conformity and coverage in 3D in agreement with PTV-based evaluation in photon therapy. Criteria for CTV D98 and D2 of the voxel-wise minimum and maximum dose showed very strong correlations to PTV D98 and D2 (R² > 0.99) and on average needed corrections of -0.9% and +2.3%, respectively.
Conclusions: A practical approach to robustness evaluation was provided and clinically implemented for PTV-less photon and proton treatment planning, consistent with PTV evaluations but without the static dose cloud approximation. (C) 2019 The Authors. Published by Elsevier B.V.
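The scenario reduction described above (voxel-wise minimum for coverage, voxel-wise maximum for hot spots, with D98 and D2 evaluated on the result) can be sketched as follows. This is a minimal illustration assuming scenario dose grids are flattened NumPy arrays; it is not the clinical implementation, and the function names are chosen for illustration.

```python
import numpy as np

def voxelwise_min_max(scenario_doses):
    """Reduce per-scenario dose grids to voxel-wise minimum and maximum
    dose distributions across all error scenarios."""
    doses = np.stack(scenario_doses)   # shape: (n_scenarios, n_voxels)
    return doses.min(axis=0), doses.max(axis=0)

def dose_at_volume(ctv_dose, volume_pct):
    """D_x: the dose received by at least x% of the CTV volume, i.e. the
    (100 - x)th percentile of the voxel doses. D98 is a near-minimum
    (coverage) metric; D2 is a near-maximum (hot spot) metric."""
    return np.percentile(ctv_dose, 100 - volume_pct)
```

Under this scheme, D98 of the voxel-wise minimum dose is checked against the coverage criterion and D2 of the voxel-wise maximum dose against the hot-spot criterion, mirroring the PTV-based evaluation the paper calibrates against.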

    A computational model of postprandial adipose tissue lipid metabolism derived using human arteriovenous stable isotope tracer data

    Given the association of disturbances in non-esterified fatty acid (NEFA) metabolism with the development of Type 2 Diabetes and Non-Alcoholic Fatty Liver Disease, computational models of glucose-insulin dynamics have been extended to account for the interplay with NEFA. In this study, we use arteriovenous measurements across the subcutaneous adipose tissue during a mixed meal challenge test to evaluate the performance and underlying assumptions of three existing models of adipose tissue metabolism, and construct a new, refined model of adipose tissue metabolism. Our model introduces new terms, explicitly accounting for the conversion of glucose to glyceraldehyde-3-phosphate, the postprandial influx of glycerol into the adipose tissue, and several physiologically relevant delays in insulin signalling, in order to better describe the measured adipose tissue fluxes. We then applied our refined model to human adipose tissue flux data collected before and after a diet intervention as part of the Yoyo study, to quantify the effects of caloric restriction on postprandial adipose tissue metabolism. Significant increases were observed in the model parameters describing the rate of uptake and release of both glycerol and NEFA. Additionally, decreases in the model's delay in insulin signalling parameters indicate an improvement in adipose tissue insulin sensitivity following caloric restriction.
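To illustrate the kind of delay term such models introduce, a first-order lag on an insulin input can be simulated as below. The equation and parameter names here are generic modelling assumptions for illustration, not the published model; a larger time constant tau corresponds to slower (less sensitive) insulin signalling.

```python
import numpy as np

def simulate_delayed_signal(insulin, dt, tau):
    """Forward-Euler integration of a first-order delay,
    x'(t) = (I(t) - x(t)) / tau,
    a common way to model a lag between plasma insulin I(t) and the
    downstream signal x(t) that drives tissue fluxes."""
    x = np.zeros_like(insulin, dtype=float)
    for i in range(1, len(insulin)):
        x[i] = x[i - 1] + dt * (insulin[i - 1] - x[i - 1]) / tau
    return x
```

For a step input the delayed signal rises exponentially toward the input level; shrinking tau (as the paper reports after caloric restriction) makes the response track insulin more closely.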

    One-fits-all pretreatment protocol facilitating Fluorescence in Situ Hybridization on formalin-fixed paraffin-embedded, fresh frozen and cytological slides

    Background: The Fluorescence In Situ Hybridization (FISH) technique is a very useful tool for diagnostic and prognostic purposes in molecular pathology. However, clinical testing on patient tissue is challenging, because variables of tissue processing can influence the quality of the results. This emphasizes the necessity of a standardized FISH protocol with a high hybridization efficiency. We present a pretreatment protocol that is easy, reproducible, and cost-effective, and that facilitates FISH on all types of patient material simultaneously with good quality results. During validation, FISH analysis was performed simultaneously on formalin-fixed paraffin-embedded, fresh frozen and cytological patient material in combination with commercial probes using our optimized one-fits-all pretreatment protocol. An optimally processed sample is characterized by strong specific signals, intact nuclear membranes, non-disturbing autofluorescence and a homogeneous DAPI staining. Results: In our retrospective cohort of 3881 patient samples, overall 93% of the FISH samples displayed good quality results leading to a patient diagnosis. All FISH results were assessed on quality aspects such as adequacy and consistency of signal strength (brightness) and lack of background and/or cross-hybridization signals; additionally, the presence of appropriate control signals was evaluated to assure probe accuracy. In our analysis, 38 different FISH probes from 3 commercial manufacturers were used (Cytocell, Vysis and ZytoLight). The majority of the patients in this cohort displayed good signal quality and little non-specific background fluorescence on all tissue types, independent of which commercial probe was used. Conclusion: The optimized one-fits-all FISH method is robust, reliable and reproducible, delivering an accurate result for patient diagnostics in a lean workflow and cost-effective manner. This protocol can be used for widespread application in cancer and non-cancer diagnostics and research.